Generalized information and entropy measures in physics
Author
Abstract
The formalism of statistical mechanics can be generalized by starting from more general measures of information than the Shannon entropy and maximizing those subject to suitable constraints. We discuss some of the most important examples of information measures that are useful for the description of complex systems. Examples treated are the Rényi entropy, Tsallis entropy, Abe entropy, Kaniadakis entropy, Sharma-Mittal entropies, and a few more. Important concepts such as the axiomatic foundations, composability and Lesche stability of information measures are briefly discussed. Potential applications in physics include complex systems with long-range interactions and metastable states, scattering processes in particle physics, hydrodynamic turbulence, defect turbulence, optical lattices, and quite generally driven nonequilibrium systems with fluctuations of temperature.
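To make the generalized measures named in the abstract concrete, here is a minimal sketch (in Python; not from the paper itself) of the Shannon, Rényi, and Tsallis entropies for a discrete distribution, using natural logarithms. Both generalized measures reduce to the Shannon entropy in the limit q → 1:

```python
import math

def shannon(p):
    """Shannon entropy S = -sum_i p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, q):
    """Rényi entropy S_q = ln(sum_i p_i^q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1.0 - q)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
# Both generalized entropies approach the Shannon value as q -> 1:
for q in (0.5, 0.9999, 2.0):
    print(q, renyi(p, q), tsallis(p, q))
```

For a uniform distribution all Rényi entropies coincide with ln N, whereas the Tsallis entropy depends on q even there; this difference in composability is one reason the two families behave differently under maximization.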
Similar resources
Dynamic Bayesian Information Measures
This paper introduces measures of information for Bayesian analysis when the support of data distribution is truncated progressively. The focus is on the lifetime distributions where the support is truncated at the current age t>=0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updatings...
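The truncation idea in this abstract can be illustrated for a discrete lifetime distribution: conditioning on survival past the current age t means keeping the tail of the distribution and renormalizing, and the Shannon entropy of that residual distribution measures the remaining uncertainty. A minimal sketch (Python; the geometric lifetime model and helper names are illustrative, not from the paper):

```python
import math

def shannon(p):
    """Shannon entropy -sum_i p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def residual_entropy(p, t):
    """Entropy of the lifetime distribution conditioned on survival past age t."""
    tail = p[t:]
    total = sum(tail)
    return shannon([pi / total for pi in tail])

# Illustrative geometric lifetime distribution on {0, 1, ..., 49}
p = [0.2 * 0.8 ** k for k in range(50)]
s = sum(p)
p = [pi / s for pi in p]

for t in (0, 5, 10):
    print(t, residual_entropy(p, t))
```

Because the geometric distribution is (up to the finite support cutoff) memoryless, its residual entropy barely changes with t; for ageing distributions it would decrease as the support is truncated progressively.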
INFORMATION MEASURES BASED TOPSIS METHOD FOR MULTICRITERIA DECISION MAKING PROBLEM IN INTUITIONISTIC FUZZY ENVIRONMENT
In the fuzzy set theory, information measures play a paramount role in several areas such as decision making, pattern recognition etc. In this paper, similarity measure based on cosine function and entropy measures based on logarithmic function for IFSs are proposed. Comparisons of proposed similarity and entropy measures with the existing ones are listed. Numerical results limpidly betoken th...
Polyadic Entropy, Synergy and Redundancy among Statistically Independent Processes in Nonlinear Statistical Physics with Microphysical Codependence
The information shared among observables representing processes of interest is traditionally evaluated in terms of macroscale measures characterizing aggregate properties of the underlying processes and their interactions. Traditional information measures are grounded on the assumption that the observable represents a memoryless process without any interaction among microstates. Generalized ent...
Quantile Approach of Generalized Cumulative Residual Information Measure of Order $(\alpha,\beta)$
In this paper, we introduce the concept of quantile-based generalized cumulative residual entropy of order $(\alpha,\beta)$ for residual and past lifetimes and study their properties. Further, we study the proposed information measure for series and parallel systems when the random variables are untruncated or truncated in nature, and some characterization results are presented. At the end, we study gene...
Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy
This study critically analyses the information-theoretic, axiomatic and combinatorial philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, since it gives (i) a derivation for the Kullback-Leibler cross-entropy and Shannon entropy functions, as simplified forms of the multinomial distribu...
Publication date: 2009